Optimized Approach (SPCA) for Load Balancing in Distributed HDFS Cluster
Authors
Abstract
Similar Resources
A Framework for Distributed Dynamic Load Balancing in Heterogeneous Cluster
Distributed Dynamic Load Balancing (DDLB) is an important system function intended to distribute workload among available processors to improve the throughput and/or execution times of parallel computers in cluster computing. Instead of balancing the load in a cluster by process migration, that is, by moving an entire process to a less loaded computer, we attempt to balance the load by splitting a process...
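The abstract above describes balancing load by splitting a process into smaller pieces rather than migrating whole processes. The sketch below is a minimal, self-contained illustration of that idea under assumed node names and load values; it is not the framework proposed in the paper.

```python
import heapq

def assign_by_splitting(process_cost, num_subtasks, node_loads):
    """Split one process into equal subtasks and greedily place each
    subtask on the node with the smallest current load."""
    subtask_cost = process_cost / num_subtasks
    # Min-heap of (current_load, node) so the least loaded node pops first.
    heap = [(load, node) for node, load in node_loads.items()]
    heapq.heapify(heap)
    placement = {}
    for i in range(num_subtasks):
        load, node = heapq.heappop(heap)
        placement[f"subtask-{i}"] = node          # run this piece on that node
        heapq.heappush(heap, (load + subtask_cost, node))
    return placement

if __name__ == "__main__":
    # Hypothetical cluster state: node name -> current load.
    nodes = {"node-a": 10.0, "node-b": 4.0, "node-c": 7.0}
    print(assign_by_splitting(process_cost=12.0, num_subtasks=4, node_loads=nodes))
```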
A Novel Adaptive Distributed Load Balancing Strategy for Cluster
With the dramatic increase in users accessing Internet services, many researchers have employed cluster servers to provide their Internet services and proposed all kinds of scheduling strategies to address the imbalance among real servers. However, most of them are based on TCP connections and pay...
A Modified Approach for Load Balancing in Cluster Computing
The needs and expectations of modern-day applications are changing in the sense that they need not only computing resources (be they processing power, memory, or disk space) but also the ability to remain available to service user requests almost constantly, 24 hours a day and 365 days a year. These needs and expectations of today’s applications result in challenging research and development eff...
Achieving Load Balancing of HDFS Clusters Using Markov Model
The combination of Hadoop and HDFS is becoming a de facto standard system for handling big data. HDFS is a distributed file system designed for big data. In HDFS, a file consists of multiple large-sized blocks. The central management of HDFS tries to scatter these blocks across different nodes to maximize I/O throughput. Hadoop is a framework that supports data-intensive parallel a...
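The abstract describes HDFS splitting a file into large blocks and scattering them across nodes to maximize I/O throughput. The following sketch illustrates that placement idea with a simplified rule (send each block to the DataNode currently holding the fewest blocks); the block size, node names, and policy are assumptions for illustration, not HDFS's actual placement logic or the paper's Markov model.

```python
import heapq

BLOCK_SIZE = 128 * 1024 * 1024  # 128 MiB, a common HDFS block size

def place_blocks(file_size, datanode_block_counts):
    """Split a file into fixed-size blocks and assign each block to the
    DataNode that currently stores the fewest blocks."""
    num_blocks = -(-file_size // BLOCK_SIZE)  # ceiling division
    heap = [(count, node) for node, count in datanode_block_counts.items()]
    heapq.heapify(heap)
    placement = []
    for block_id in range(num_blocks):
        count, node = heapq.heappop(heap)
        placement.append((block_id, node))       # this block lands on that node
        heapq.heappush(heap, (count + 1, node))
    return placement

if __name__ == "__main__":
    # Hypothetical cluster state: DataNode -> number of blocks already stored.
    nodes = {"dn-1": 120, "dn-2": 95, "dn-3": 110}
    print(place_blocks(file_size=700 * 1024 * 1024, datanode_block_counts=nodes))
```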
An Optimized Approach for Processing Small Files in HDFS
In today’s world, cloud storage has become an important part of the cloud computing system. Hadoop is open-source software for computing over huge numbers of data sets, facilitating storage, analysis, management, and access functionality in distributed systems across a huge number of machines. Much of the user-created data consists of small files. HDFS is a distributed file system that manages the file processin...
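The abstract is cut off before the paper's approach is described. As a generic illustration only (not the method proposed in the paper), a common way to cope with many small files in HDFS is to pack them into one large container file plus an index, so the NameNode tracks a single large object instead of thousands of tiny ones. The file names and container format below are hypothetical.

```python
import os

def pack_small_files(paths, container_path):
    """Concatenate many small files into one container file and build an
    index of (offset, length) per original file, so one large object can
    be uploaded to HDFS instead of many tiny ones."""
    index = {}
    offset = 0
    with open(container_path, "wb") as container:
        for path in paths:
            with open(path, "rb") as f:
                data = f.read()
            container.write(data)
            index[os.path.basename(path)] = (offset, len(data))
            offset += len(data)
    return index

def read_packed(container_path, index, name):
    """Recover one original file from the container using the index."""
    offset, length = index[name]
    with open(container_path, "rb") as container:
        container.seek(offset)
        return container.read(length)
```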
Journal
Journal title: SN Computer Science
Year: 2020
ISSN: 2662-995X, 2661-8907
DOI: 10.1007/s42979-020-0107-8